    Canonical Proof nets for Classical Logic

    Proof nets provide abstract counterparts to sequent proofs modulo rule permutations; the idea being that if two proofs have the same underlying proof net, they are in essence the same proof. Providing a convincing proof-net counterpart to proofs in the classical sequent calculus is thus an important step in understanding classical sequent calculus proofs. By convincing, we mean that (a) there should be a canonical function from sequent proofs to proof nets, (b) it should be possible to check the correctness of a net in polynomial time, (c) every correct net should be obtainable from a sequent calculus proof, and (d) there should be a cut-elimination procedure which preserves correctness. Previous attempts to give proof-net-like objects for propositional classical logic have failed at least one of the above conditions. In [23], the author presented a calculus of proof nets (expansion nets) satisfying (a) and (b); the paper defined a sequent calculus corresponding to expansion nets but gave no explicit demonstration of (c). That sequent calculus, called LK* in this paper, is a novel one-sided sequent calculus with both additively and multiplicatively formulated disjunction rules. In this paper (a self-contained extended version of [23]), we give a full proof of (c) for expansion nets with respect to LK*, and in addition give a cut-elimination procedure internal to expansion nets - this makes expansion nets the first notion of proof net for classical logic satisfying all four criteria.
    Comment: Accepted for publication in APAL (Special issue, Classical Logic and Computation)
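    To make the additive/multiplicative distinction concrete, the following is a sketch of the two standard textbook formulations of a disjunction rule in a one-sided sequent calculus (not necessarily the exact rules of LK*):

    \[
      \frac{\vdash \Gamma, A, B}{\vdash \Gamma, A \vee B}\ (\vee_{\mathrm{m}})
      \qquad
      \frac{\vdash \Gamma, A}{\vdash \Gamma, A \vee B}\ (\vee_{\mathrm{a}1})
      \qquad
      \frac{\vdash \Gamma, B}{\vdash \Gamma, A \vee B}\ (\vee_{\mathrm{a}2})
    \]

    The multiplicative form keeps both disjuncts in a single premise, while the additive forms introduce the disjunction from one disjunct alone.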

    A Predictive Model of Cognitive Performance Under Acceleration Stress

    Extreme acceleration maneuvers encountered in modern agile fighter aircraft can wreak havoc on human physiology, thereby significantly influencing cognitive task performance. Increased acceleration causes a shift in local arterial blood pressure and perfusion, leading to declines in regional cerebral oxygen saturation. As oxygen content continues to decline, activity in higher-order cortical tissue is reduced to ensure sufficient metabolic resources are available for critical life-sustaining autonomic functions. Consequently, cognitive abilities reliant on these affected areas suffer significant performance degradations. The goal of this effort was to develop and validate a model capable of predicting human cognitive performance under acceleration stress. An Air Force program entitled Human Information Processing in Dynamic Environments (HIPDE) evaluated cognitive performance across twelve tasks under various levels of acceleration stress. Data sets from this program were leveraged for model development and validation. Development began with creation of a proportional control cardiovascular model that produced predictions of several hemodynamic parameters, including eye-level blood pressure. The relationship between eye-level blood pressure and regional cerebral oxygen saturation (rSO2) was defined and validated with objective data from two different HIPDE experiments. An algorithm was derived to relate changes in rSO2 within specific brain structures to performance on cognitive tasks that require engagement of different brain areas. Data from two acceleration profiles (3 and 7 Gz) in the Motion Inference experiment were used in algorithm development, while data from the remaining two profiles (5 and 7 Gz SACM) verified model predictions. Data from the precision timing experiment were then used to validate the model's predictions of cognitive performance on the precision timing task as a function of Gz profile. Agreement between the measured and predicted values was defined as a correlation coefficient close to 1, a linear best-fit slope close to 1 on a plot of measured vs. predicted values, and a low mean percent error. Results showed good overall agreement between the measured and predicted values for the rSO2 model (correlation coefficient: 0.7483-0.8687; linear best-fit slope: 0.5760-0.9484; mean percent error: 0.75-3.33) and the cognitive performance models (Motion Inference task - correlation coefficient: 0.7103-0.9451; linear best-fit slope: 0.7416-0.9144; mean percent error: 6.35-38.21; precision timing task - correlation coefficient: 0.6856-0.9726; linear best-fit slope: 0.5795-1.027; mean percent error: 6.30-17.28). The evidence suggests that the model is an accurate predictor of cognitive performance under high acceleration stress across tasks, the first such model to be developed. Applications of the model include Air Force mission planning, pilot training, improved adversary simulation, analysis of astronaut launch and reentry profiles, and safety analysis of extreme amusement rides.
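    The three agreement statistics used for validation (correlation coefficient, linear best-fit slope of measured versus predicted values, and mean percent error) can be computed as in the following minimal sketch; the function name and the toy numbers are illustrative, not HIPDE data:

    import numpy as np

    def agreement_metrics(measured, predicted):
        """Correlation, slope of the measured-vs-predicted best-fit line, and mean percent error."""
        measured = np.asarray(measured, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        r = np.corrcoef(measured, predicted)[0, 1]           # correlation coefficient
        slope, _ = np.polyfit(predicted, measured, 1)        # best-fit slope, ideally close to 1
        mpe = 100.0 * np.mean(np.abs(predicted - measured) / np.abs(measured))  # mean percent error
        return r, slope, mpe

    # Example with made-up rSO2 values (percent saturation).
    measured = [68.0, 65.5, 63.2, 60.8, 59.1]
    predicted = [67.1, 65.9, 62.5, 61.4, 58.3]
    print(agreement_metrics(measured, predicted))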

    Changes in Earnings Differentials in the 1980s: Concordance, Convergence, Causes, and Consequences

    This paper analyzes changes in U.S. earnings differentials in the 1980s between race, gender, age, and schooling groups. There are four main sets of results to report. First, the economic position of less-educated workers declined relative to the more-educated among almost all demographic groups. Education-earnings differentials clearly rose for whites, but less clearly for blacks, while employment rate differences associated with education increased more for blacks than for whites. Second, much of the change in education-earnings differentials for specific groups is attributable to measurable economic factors: to changes in the occupational or industrial structure of employment; to changes in average wages within industries; to the fall in the real value of the minimum wage and the fall in union density; and to changes in the relative growth rate of more-educated workers. Third, the earnings and employment position of white females, and to a lesser extent of black females, converged to that of white males in the 1980s, across education groups. At the same time, the economic position of more-educated black males appears to have worsened relative to their white-male counterparts. Fourth, there has been a sizable college-enrollment response to the rising relative wages of college graduates. This response suggests that education-earnings differentials may stop increasing, or even start to decline, in the near future.

    "Changes in Earnings Differentials in the 1980s: Concordance, Convergence, Causes, and Consequence"

    Get PDF
    This paper analyzes changes in U.S. earnings differentials in the 1980s between race, gender, age, and schooling groups. There are four main sets of results to report. First, the economic position of less-educated workers declined relative to the more-educated among almost all demographic groups. Education-earnings differentials clearly rose for whites, but less clearly for blacks, while employment rate differences associated with education increased more for blacks than for whites. Second, much of the change in education-earnings differentials for specific groups is attributable to measurable economic factors: to changes in the occupational or industrial structure of employment; to changes in average wages within industries; to the fall in the real value of the minimum wage and the fall in union density; and to changes in the relative growth rate of more educated workers. Third, the earnings and employment position of white females, and to a lesser extent of black females, converged to that of white males in the 1980s, across education groups. At the same time, the economic position of more-educated black males appears to have worsened relative to their white-male counterparts. Fourth, there has been a sizable college-enrollment response to the rising relative wages of college graduates. This response suggests that education-earnings differentials may stop increasing, or even start to decline, in the near future.

    CortexMorph: fast cortical thickness estimation via diffeomorphic registration using VoxelMorph

    The thickness of the cortical band is linked to various neurological and psychiatric conditions, and is often estimated through surface-based methods such as FreeSurfer in MRI studies. The DiReCT method, which calculates cortical thickness using a diffeomorphic deformation of the gray-white matter interface towards the pial surface, offers an alternative to surface-based methods. Recent studies using a synthetic cortical thickness phantom have demonstrated that the combination of DiReCT and deep-learning-based segmentation is more sensitive to subvoxel cortical thinning than FreeSurfer. While anatomical segmentation of a T1-weighted image now takes seconds, existing implementations of DiReCT rely on iterative image registration methods which can take up to an hour per volume. On the other hand, learning-based deformable image registration methods like VoxelMorph have been shown to be faster than classical methods while improving registration accuracy. This paper proposes CortexMorph, a new method that employs unsupervised deep learning to directly regress the deformation field needed for DiReCT. By combining CortexMorph with a deep-learning-based segmentation model, it is possible to estimate region-wise thickness in seconds from a T1-weighted image, while maintaining the ability to detect cortical atrophy. We validate this claim on the OASIS-3 dataset and the synthetic cortical thickness phantom of Rusak et al.
    Comment: Accepted (early acceptance) at MICCAI 202
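    The VoxelMorph idea that CortexMorph builds on (a network regresses a dense deformation field, a differentiable warp applies it, and training minimises image similarity plus a smoothness penalty, with no ground-truth deformations) can be sketched roughly as below. This is an illustrative 2-D toy in PyTorch with made-up names (RegNet, warp, smoothness), an MSE similarity term, and a plain displacement field rather than the diffeomorphic velocity-field parameterisation; it is not the authors' implementation:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class RegNet(nn.Module):
        """Tiny CNN predicting a 2-D displacement field from a (moving, fixed) image pair."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 2, 3, padding=1),      # 2 channels: (dx, dy) in pixels
            )
        def forward(self, moving, fixed):
            return self.net(torch.cat([moving, fixed], dim=1))

    def warp(image, flow):
        """Warp image (N,1,H,W) with displacement flow (N,2,H,W), in pixel units."""
        n, _, h, w = image.shape
        ys, xs = torch.meshgrid(torch.arange(h, dtype=image.dtype),
                                torch.arange(w, dtype=image.dtype), indexing="ij")
        grid_x = xs.unsqueeze(0) + flow[:, 0]        # displaced x coordinates
        grid_y = ys.unsqueeze(0) + flow[:, 1]        # displaced y coordinates
        # Normalise to [-1, 1] as grid_sample expects; last dimension is (x, y).
        grid = torch.stack([2 * grid_x / (w - 1) - 1,
                            2 * grid_y / (h - 1) - 1], dim=-1)
        return F.grid_sample(image, grid, align_corners=True)

    def smoothness(flow):
        """Penalise spatial gradients of the displacement field."""
        dx = flow[:, :, :, 1:] - flow[:, :, :, :-1]
        dy = flow[:, :, 1:, :] - flow[:, :, :-1, :]
        return (dx ** 2).mean() + (dy ** 2).mean()

    # One unsupervised training step on random stand-in "images".
    model = RegNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    moving, fixed = torch.rand(1, 1, 64, 64), torch.rand(1, 1, 64, 64)
    opt.zero_grad()
    flow = model(moving, fixed)
    loss = F.mse_loss(warp(moving, flow), fixed) + 0.01 * smoothness(flow)
    loss.backward()
    opt.step()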

    Beltway: Getting Around Garbage Collection Gridlock

    We present the design and implementation of a new garbage collection framework that significantly generalizes existing copying collectors. The Beltway framework exploits and separates object age and incrementality. It groups objects in one or more increments on queues called belts, collects belts independently, and collects increments on a belt in first-in-first-out order. We show that Beltway configurations, selected by command line options, act and perform the same as semi-space, generational, and older-first collectors, and encompass all previous copying collectors of which we are aware. The increasing reliance on garbage-collected languages such as Java requires that the collector perform well. We show that the generality of Beltway enables us to design and implement new collectors that are robust to variations in heap size and improve total execution time over the best generational copying collectors of which we are aware by up to 40%, and on average by 5 to 10%, for small to moderate heap sizes. New garbage collection algorithms are rare, and yet we define not just one, but a new family of collectors that subsumes previous work. This generality enables us to explore a larger design space and build better collectors.
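    The belt/increment organisation can be pictured with the toy sketch below (illustrative names throughout; the actual framework is a copying collector built inside a Java virtual machine, not a Python data structure). Objects are allocated into the youngest increment of a belt, and collection takes the oldest increment of a chosen belt and copies its survivors onto another belt:

    from collections import deque

    class Increment:
        def __init__(self):
            self.objects = []                          # objects allocated into this increment

    class Belt:
        """A FIFO queue of increments; increments are collected oldest-first."""
        def __init__(self):
            self.increments = deque([Increment()])
        def allocate(self, obj):
            self.increments[-1].objects.append(obj)    # allocate into the youngest increment
        def open_increment(self):
            self.increments.append(Increment())
        def collect_oldest(self, survivor_belt, is_live):
            """Collect the first-in increment, copying live objects to survivor_belt."""
            oldest = self.increments.popleft()
            for obj in oldest.objects:
                if is_live(obj):
                    survivor_belt.allocate(obj)

    # A two-belt configuration that behaves like a generational collector:
    # nursery survivors are copied onto a mature belt.
    nursery, mature = Belt(), Belt()
    nursery.allocate("obj-a"); nursery.allocate("obj-b")
    nursery.open_increment(); nursery.allocate("obj-c")
    nursery.collect_oldest(mature, is_live=lambda obj: obj != "obj-b")   # stand-in liveness test
    print([o for inc in mature.increments for o in inc.objects])         # ['obj-a']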

    Nested sampling for Bayesian model comparison in the context of Salmonella disease dynamics.

    Understanding the mechanisms underlying the observed dynamics of complex biological systems requires the statistical assessment and comparison of multiple alternative models. Although this has traditionally been done using maximum likelihood-based methods such as Akaike's Information Criterion (AIC), Bayesian methods have gained in popularity because they provide more informative output in the form of posterior probability distributions. However, comparison between multiple models in a Bayesian framework is made difficult by the computational cost of numerical integration over large parameter spaces. A new, efficient method for the computation of posterior probabilities has recently been proposed and applied to complex problems from the physical sciences. Here we demonstrate how nested sampling can be used for inference and model comparison in the biological sciences. We present a reanalysis of data from experimental infection of mice with Salmonella enterica showing the distribution of bacteria in liver cells. In addition to confirming the main finding of the original analysis, which relied on AIC, our approach provides: (a) integration across the parameter space, (b) estimation of the posterior parameter distributions (with visualisations of parameter correlations), and (c) estimation of the posterior predictive distributions for goodness-of-fit assessments of the models. The goodness-of-fit results suggest that alternative mechanistic models and a relaxation of the quasi-stationary assumption should be considered.
    RD was funded by the Biotechnology and Biological Sciences Research Council (BBSRC) (grant number BB/I002189/1). TJM was funded by the Biotechnology and Biological Sciences Research Council (BBSRC) (grant number BB/I012192/1). OR was funded by the Royal Society.
    This paper was originally published in PLOS ONE (Dybowski R, McKinley TJ, Mastroeni P, Restif O, PLoS ONE 2013, 8(12): e82317. doi:10.1371/journal.pone.0082317).
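    As a concrete picture of the evidence calculation at the heart of the method, here is a minimal nested sampling loop on a toy one-parameter problem (uniform prior on [0, 1], Gaussian likelihood). The names, settings, and the rejection-sampling step for the likelihood-constrained draws are illustrative simplifications, not the analysis pipeline used in the paper:

    import numpy as np

    rng = np.random.default_rng(0)

    def log_likelihood(theta):
        # Gaussian likelihood centred at 0.5 with standard deviation 0.1.
        return -0.5 * ((theta - 0.5) / 0.1) ** 2 - np.log(0.1 * np.sqrt(2 * np.pi))

    def sample_prior():
        return rng.uniform(0.0, 1.0)              # uniform prior on [0, 1]

    def nested_sampling(n_live=100, n_iter=600):
        """Return an estimate of the log-evidence log Z for the toy model."""
        live = np.array([sample_prior() for _ in range(n_live)])
        live_logl = np.array([log_likelihood(t) for t in live])
        log_z, log_x_prev = -np.inf, 0.0          # running log-evidence; log prior volume
        for i in range(1, n_iter + 1):
            worst = np.argmin(live_logl)
            log_x = -i / n_live                   # expected log prior-volume shrinkage
            log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))   # width of this shell
            log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
            log_x_prev = log_x
            # Replace the worst point with a prior draw above the likelihood
            # threshold (simple rejection sampling; real samplers are cleverer).
            while True:
                theta = sample_prior()
                if log_likelihood(theta) > live_logl[worst]:
                    live[worst], live_logl[worst] = theta, log_likelihood(theta)
                    break
        # Add the contribution of the remaining live points.
        lmax = live_logl.max()
        log_z = np.logaddexp(log_z, lmax + np.log(np.mean(np.exp(live_logl - lmax))) + log_x_prev)
        return log_z

    print(nested_sampling())   # the true log-evidence for this toy problem is close to 0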